Faster Principal Component Regression and Stable Matrix Chebyshev Approximation

Authors

  • Zeyuan Allen-Zhu
  • Yuanzhi Li
Abstract

We solve principal component regression (PCR), up to a multiplicative accuracy of 1+γ, by reducing the problem to Õ(γ⁻¹) black-box calls to ridge regression. Our algorithm therefore does not require any explicit construction of the top principal components, and is suitable for large-scale PCR instances. In contrast, the previous result required Õ(γ⁻²) such black-box calls. We obtain this result by developing a general stable recurrence formula for matrix Chebyshev polynomials, and a degree-optimal polynomial approximation to the matrix sign function. Our techniques may be of independent interest, especially in the design of iterative methods.
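The reduction described in the abstract — projecting onto the top principal components using only black-box calls to ridge regression — can be illustrated with a small self-contained sketch. This is not the paper's algorithm: for readability it substitutes the simple Newton–Schulz polynomial iteration for the paper's degree-optimal stable Chebyshev approximation of sign(x), forms small matrices explicitly, and uses illustrative names (`ridge_solve`, `lam`) that do not come from the paper.

```python
import numpy as np

# Small synthetic instance.  Goal: project b onto the span of the
# principal components of A whose eigenvalues (of A^T A) exceed lam.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 6))
b = rng.standard_normal(6)

evals, evecs = np.linalg.eigh(A.T @ A)
lam = (evals[2] + evals[3]) / 2  # threshold strictly between two eigenvalues

def ridge_solve(v):
    # Black-box ridge regression oracle: returns (A^T A + lam*I)^{-1} v.
    # At scale this would be an iterative solver, not a direct solve.
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), v)

# B = (A^T A + lam*I)^{-1} A^T A has eigenvalues s/(s + lam), which are
# > 1/2 exactly when s > lam.  Hence the desired projection equals
# (sign(2B - I) + I) / 2 applied to b.
d = A.shape[1]
S = np.column_stack([2 * ridge_solve(A.T @ (A @ e)) - e for e in np.eye(d)])

# Approximate sign(S) with the Newton–Schulz polynomial x -> (3x - x^3)/2,
# a simple stand-in for the paper's stable matrix Chebyshev approximation.
X = S.copy()
for _ in range(60):
    X = 1.5 * X - 0.5 * (X @ X @ X)

proj = 0.5 * (X + np.eye(d)) @ b

# Reference answer via explicit eigendecomposition (what the algorithm avoids).
V = evecs[:, evals > lam]
exact = V @ (V.T @ b)
```

Each column of S above costs one ridge solve; the point of the paper is, roughly, that the sign-function step can instead be applied directly to the vector with only Õ(γ⁻¹) ridge solves, without ever forming S or X.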


Similar articles

A fractional type of the Chebyshev polynomials for approximation of solution of linear fractional differential equations

In this paper we introduce a type of fractional-order polynomials based on the classical Chebyshev polynomials of the second kind (FCSs). We also construct the operational matrix of the fractional derivative of order γ in the Caputo sense for FCSs, and show that this matrix, together with the Tau method, can be utilized to reduce the solution of some fractional-order differential equations.


Faster Principal Component Regression via Optimal Polynomial Approximation to sgn(x)

We solve principal component regression (PCR) by providing an efficient algorithm to project any vector onto the subspace formed by the top principal components of a matrix. Our algorithm does not require any explicit construction of the top principal components, and therefore is suitable for large-scale PCR instances. Specifically, to project onto the subspace formed by principal components wi...


Solving high-order partial differential equations in unbounded domains by means of double exponential second kind Chebyshev approximation

In this paper, a collocation method for solving high-order linear partial differential equations (PDEs) with variable coefficients under more general form of conditions is presented. This method is based on the approximation of the truncated double exponential second kind Chebyshev (ESC) series. The definition of the partial derivative is presented and derived as new operational matrices of der...


Chebyshev Approximations to the Histogram χ² Kernel

The random Fourier features methodology can be used to approximate the performance of kernel classifiers in linear time in the number of training examples. However, there still exists a non-trivial performance gap between the approximation and the nonlinear kernel classifiers, especially for the exponential χ² kernel, one of the most powerful models for histograms. Based on analogies with Chebys...


Principal Component Projection Without Principal Component Analysis

We show how to efficiently project a vector onto the top principal components of a matrix, without explicitly computing these components. Specifically, we introduce an iterative algorithm that provably computes the projection using few calls to any black-box routine for ridge regression. By avoiding explicit principal component analysis (PCA), our algorithm is the first with no runtime dependen...





Publication date: 2017